firing threshold
Adversarially Robust Spiking Neural Networks Through Conversion
Özdenizci, Ozan, Legenstein, Robert
Spiking neural networks (SNNs) provide an energy-efficient alternative to a variety of artificial neural network (ANN) based AI applications. As progress in neuromorphic computing expands the use of SNNs in applications, the problem of adversarial robustness of SNNs becomes more pronounced. In contrast to the widely explored end-to-end adversarial training based solutions, we address the limited progress in scalable robust SNN training methods by proposing an adversarially robust ANN-to-SNN conversion algorithm. Our method provides an efficient approach to embrace various computationally demanding robust learning objectives that have been proposed for ANNs. During a post-conversion robust finetuning phase, our method adversarially optimizes both the layer-wise firing thresholds and the synaptic connectivity weights of the SNN to maintain the robustness gains transferred from the pre-trained ANN. We perform experimental evaluations in numerous adaptive adversarial settings that account for the spike-based operation dynamics of SNNs, and show that our approach yields a scalable state-of-the-art solution for adversarially robust deep SNNs with low latency.
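The conversion step the abstract builds on can be sketched with the common max-activation heuristic. This is an assumption for illustration, not the paper's exact procedure: each layer's firing threshold is initialized from the peak pre-activation the source ANN produces on calibration data, and would then be adversarially finetuned. Layer names and activation values here are made up.

```python
import numpy as np

def calibrate_thresholds(ann_activations):
    """Map each layer to the maximum ANN pre-activation observed on
    calibration data, used as that layer's initial firing threshold."""
    return {layer: float(np.max(acts)) for layer, acts in ann_activations.items()}

# Hypothetical calibration activations for a two-layer network.
acts = {
    "conv1": np.array([0.1, 0.9, 2.4, 1.7]),
    "fc1": np.array([0.3, 1.1, 0.8]),
}
thresholds = calibrate_thresholds(acts)
print(thresholds)  # {'conv1': 2.4, 'fc1': 1.1}
```

In the paper's setting these initial thresholds are not fixed: they become trainable parameters during the robust finetuning phase.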
Wearable-based Human Activity Recognition with Spatio-Temporal Spiking Neural Networks
Li, Yuhang, Yin, Ruokai, Park, Hyoungseob, Kim, Youngeun, Panda, Priyadarshini
We study the Human Activity Recognition (HAR) task, which predicts a user's daily activity from time series data collected by wearable sensors. Recently, researchers have used end-to-end Artificial Neural Networks (ANNs) to extract features and perform classification in HAR. However, ANNs pose a huge computational burden on wearable devices and lack temporal feature extraction. In this work, we apply Spiking Neural Networks (SNNs)--an architecture inspired by biological neurons--to HAR tasks. SNNs allow spatio-temporal extraction of features and enjoy low-power computation with binary spikes. We conduct extensive experiments on three HAR datasets with SNNs, demonstrating that SNNs are on par with ANNs in terms of accuracy while reducing energy consumption by up to 94%. The code is publicly available at https://github.com/
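The energy argument behind claims like the one above can be sketched as a back-of-the-envelope ratio: an ANN layer performs one multiply-accumulate (MAC) per connection, while an SNN performs an accumulate (AC) only when a binary spike arrives. The per-operation energies (4.6 pJ per 32-bit float MAC, 0.9 pJ per add, common 45 nm CMOS estimates) and the spike rate below are illustrative assumptions, not figures from this paper.

```python
def relative_snn_energy(spike_rate, timesteps, e_mac=4.6, e_ac=0.9):
    """Ratio of SNN to ANN energy for one layer: the SNN accumulates
    only on spikes (spike_rate per neuron per timestep), while the
    ANN performs a full MAC per connection once."""
    return (spike_rate * timesteps * e_ac) / e_mac

# With a 5% spike rate over 4 timesteps, the SNN uses ~4% of the
# ANN's energy for this layer under these assumptions.
print(round(relative_snn_energy(spike_rate=0.05, timesteps=4), 3))
```

At higher spike rates or longer time windows the ratio grows, which is why sparsity and low latency both matter for the energy claim.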
MT-SNN: Spiking Neural Network that Enables Single-Tasking of Multiple Tasks
Cachi, Paolo G., Ventura, Sebastian, Cios, Krzysztof J.
In this paper we explore the capabilities of spiking neural networks in solving multi-task classification problems using the approach of single-tasking of multiple tasks. We designed and implemented a multi-task spiking neural network (MT-SNN) that can learn two or more classification tasks while performing one task at a time. The task to perform is selected by modulating the firing threshold of the leaky integrate-and-fire neurons used in this work. The network is implemented using Intel's Lava platform for the Loihi2 neuromorphic chip. Tests are performed on dynamic multi-task classification using the NMNIST dataset. The results show that MT-SNN effectively learns multiple tasks by modifying its dynamics, namely the spiking neurons' firing threshold.
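The threshold-modulation mechanism can be illustrated with a minimal leaky integrate-and-fire update. This is a sketch of the general idea, not the Lava/Loihi2 implementation; the thresholds, leak, and inputs are invented values. The same input spikes under the low "task A" threshold but stays silent under the high "task B" threshold, so switching the threshold effectively switches which task the network computes.

```python
import numpy as np

def lif_step(v, inp, threshold, leak=0.9):
    """One leaky integrate-and-fire update: decay the membrane
    potential, add input, spike where it crosses threshold, reset."""
    v = leak * v + inp
    spike = (v >= threshold).astype(float)
    v = v * (1.0 - spike)  # hard reset for neurons that spiked
    return v, spike

# Task selection by threshold modulation (illustrative values).
task_thresholds = {"task_A": 0.5, "task_B": 2.0}
v0 = np.zeros(3)
inp = np.array([0.6, 0.2, 1.0])

_, spikes_a = lif_step(v0.copy(), inp, task_thresholds["task_A"])
_, spikes_b = lif_step(v0.copy(), inp, task_thresholds["task_B"])
print(spikes_a)  # [1. 0. 1.]
print(spikes_b)  # [0. 0. 0.]
```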
DIET-SNN: Direct Input Encoding With Leakage and Threshold Optimization in Deep Spiking Neural Networks
Bio-inspired spiking neural networks (SNNs), operating with asynchronous binary signals (or spikes) distributed over time, can potentially lead to greater computational efficiency on event-driven hardware. State-of-the-art SNNs suffer from high inference latency, resulting from inefficient input encoding and sub-optimal settings of the neuron parameters (firing threshold and membrane leak). We propose DIET-SNN, a low-latency deep spiking network that is trained with gradient descent to optimize the membrane leak and the firing threshold along with the other network parameters (weights). The membrane leak and threshold for each layer of the SNN are optimized with end-to-end backpropagation to achieve competitive accuracy at reduced latency. The analog pixel values of an image are applied directly to the input layer of DIET-SNN without being converted to spike trains. The information is converted into spikes in the first convolutional layer, where leaky-integrate-and-fire (LIF) neurons integrate the weighted inputs and generate an output spike when the membrane potential crosses the trained firing threshold. The trained membrane leak controls the flow of input information and attenuates irrelevant inputs to increase the activation sparsity in the convolutional and linear layers of the network. The reduced latency combined with high activation sparsity provides large improvements in computational efficiency. We evaluate DIET-SNN on image classification tasks from the CIFAR and ImageNet datasets on VGG and ResNet architectures. We achieve top-1 accuracy of 66.52% with 25 timesteps (inference latency) on the ImageNet dataset with 3.1X less compute energy than an equivalent standard ANN. Additionally, DIET-SNN performs 5-100X faster inference compared to other state-of-the-art SNN models.
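The direct-input-encoding and trainable-parameter ideas above can be sketched as follows. This is an illustrative NumPy version, not the paper's code: the same analog input is applied at every timestep, and `leak` and `threshold` stand in for the per-layer parameters that DIET-SNN learns by backpropagation (the weights and values below are invented).

```python
import numpy as np

def lif_layer(analog_input, weights, leak, threshold, timesteps=5):
    """LIF layer with direct input encoding: integrate the weighted
    analog input at every timestep, spike on threshold crossing, and
    soft-reset by subtracting the threshold (preserving the residual)."""
    v = np.zeros(weights.shape[0])
    spikes = []
    for _ in range(timesteps):
        v = leak * v + weights @ analog_input
        s = (v >= threshold).astype(float)
        v = v - s * threshold
        spikes.append(s)
    return np.stack(spikes)  # shape: (timesteps, neurons)

x = np.array([0.2, 0.5])                 # analog pixel values, no spike-train encoding
w = np.array([[1.0, 0.5], [0.3, 0.1]])
out = lif_layer(x, w, leak=0.8, threshold=1.0)
print(out.mean(axis=0))                  # per-neuron firing rate over 5 timesteps
```

A smaller leak or a higher threshold lowers these firing rates, which is the sparsity-versus-accuracy trade-off the training procedure optimizes.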
Computer Modeling of Associative Learning
Alkon, Daniel L., Quek, Francis K. H., Vogl, Thomas P.
This paper describes an ongoing effort which approaches neural net research in a program of close collaboration of neuroscientists and engineers. The effort is designed to elucidate associative learning in the marine snail Hermissenda crassicornis, in which Pavlovian conditioning has been observed. Learning has been isolated in the four neuron network at the convergence of the visual and vestibular pathways in this animal, and biophysical changes, specific to learning, have been observed in the membrane of the photoreceptor B cell. A basic charging capacitance model of a neuron is used and enhanced with biologically plausible mechanisms that are necessary to replicate the effect of learning at the cellular level. These mechanisms are nonlinear and are, primarily, instances of second order control systems (e.g.
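The "basic charging capacitance model" mentioned above is the standard RC membrane equation: the potential charges toward the input drive with time constant tau = R*C. The parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def membrane_charge(i_input, r=1.0, c=10.0, dt=1.0, steps=50):
    """Euler integration of the RC membrane equation
    dV/dt = (I*R - V) / tau, with tau = R*C."""
    tau = r * c
    v = 0.0
    trace = []
    for _ in range(steps):
        v += dt / tau * (i_input * r - v)
        trace.append(v)
    return np.array(trace)

trace = membrane_charge(i_input=1.0)
# After ~5 time constants the potential approaches its steady state I*R = 1.
print(round(trace[-1], 3))  # 0.995
```

The paper's model then layers nonlinear, biologically motivated mechanisms on top of this passive charging behavior.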